
[SPARK-1550] Fixed - Successive creation of spark context fails in pyspark, if the previous initialization of spark context had failed. #478

Closed
prabinb wants to merge 1 commit

Conversation

@prabinb commented Apr 22, 2014

No description provided.

Commit: Fixed SPARK-1550 - Successive creation of spark context fails in pyspark, if the previous initialization of spark context had failed.
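For readers who want the background (the PR itself carries no description): in PySpark at the time, a SparkContext whose constructor raised was still left registered as the active context, so every later attempt was rejected. The snippet below is an illustrative repro sketch, not code from this PR; the invalid master URL is just one convenient way to make initialization fail.

```python
from pyspark import SparkContext

# First attempt: an invalid master URL makes initialization raise partway
# through __init__ (any exception during init has the same effect).
try:
    sc = SparkContext("not-a-valid-master-url", "failing-app")
except Exception as exc:
    print("first attempt failed:", exc)

# Before the fix, the half-constructed context was still recorded as active,
# so this second, valid attempt failed with
# "Cannot run multiple SparkContexts at once" even though nothing was running.
sc = SparkContext("local", "successive-context")
sc.stop()
```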
@AmplabJenkins

Can one of the admins verify this patch?

@mateiz (Contributor) commented Apr 24, 2014

Jenkins, this is ok to test

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14432/

pwendell added a commit to pwendell/spark that referenced this pull request May 12, 2014
SPARK-1033. Ask for cores in Yarn container requests

Tested on a pseudo-distributed cluster against the Fair Scheduler and observed a worker taking more than a single core.
@JoshRosen (Contributor)

Hi @prabinb,

Thanks for submitting this PR. This issue has been fixed by #1606, so do you mind closing this? Thanks!
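
For readers following the resolution: the general shape of the fix (whether done as in this PR or as in #1606) is to make the constructor clean up its module-level "active context" registration when initialization raises. The toy class below is a self-contained sketch of that pattern only; the names (_do_init, _active_context, SketchContext) are illustrative assumptions, not the actual PySpark internals or the code of #1606.

```python
class SketchContext(object):
    """Toy stand-in for pyspark.SparkContext, illustrating the cleanup pattern."""

    _active_context = None  # module-level record of the one live context

    def __init__(self, master, app_name):
        if SketchContext._active_context is not None:
            raise RuntimeError("Cannot run multiple SparkContexts at once")
        SketchContext._active_context = self
        try:
            self._do_init(master, app_name)
        except Exception:
            # Key point: a failed init must not leave the global slot occupied.
            self.stop()
            raise

    def _do_init(self, master, app_name):
        if not master.startswith("local"):
            raise ValueError("Could not parse Master URL: %r" % master)
        self.master, self.app_name = master, app_name

    def stop(self):
        SketchContext._active_context = None


# First attempt fails but cleans up after itself, so the second attempt works.
try:
    SketchContext("bogus://host", "first")
except ValueError as exc:
    print("init failed:", exc)

sc = SketchContext("local[2]", "second")
```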

@asfgit closed this in 2c35666 on Jul 30, 2014
andrewor14 pushed a commit to andrewor14/spark that referenced this pull request Jan 8, 2015
SPARK-1033. Ask for cores in Yarn container requests

Tested on a pseudo-distributed cluster against the Fair Scheduler and observed a worker taking more than a single core.
(cherry picked from commit 576c4a4)

Signed-off-by: Patrick Wendell <pwendell@gmail.com>
mccheah pushed a commit to mccheah/spark that referenced this pull request Feb 14, 2019
yifeih pushed a commit to yifeih/spark that referenced this pull request Mar 5, 2019
This PR reverts back to using Scala 2.11

* Revert "Fix distribution publish to scala 2.12 apache#478"
* Revert "[SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0"
bzhaoopenstack pushed a commit to bzhaoopenstack/spark that referenced this pull request Sep 11, 2019
Check for branch master before docker push